Learning to update Auto-associative Memory in Recurrent Neural Networks for Improving Sequence Memorization
Authors
Abstract
Learning to remember long sequences remains a challenging task for recurrent neural networks. Register memory and attention mechanisms have both been proposed to address the issue, but they either incur high computational cost to keep the memory differentiable, or bias RNN representation learning towards encoding short local contexts rather than long sequences. Associative memory, which studies the compression of multiple patterns into a fixed-size memory, has rarely been considered in recent years. Although some recent work introduces associative memory into RNNs and mimics the energy-decay process of Hopfield networks, it inherits the shortcoming of rule-based memory updates, and its memory capacity is limited. This paper proposes a method that learns the memory update rule jointly with the task objective to improve memory capacity for remembering long sequences. We also propose an architecture that uses multiple such associative memories for more complex input encoding. We observe some interesting behaviors when comparing against other RNN architectures on several well-studied sequence learning tasks.
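The abstract does not specify the learned update rule, so the sketch below is only illustrative of the general idea: it contrasts a fixed Hopfield-style outer-product write with a hypothetical parameterized, gated write whose matrices (`W_gate` and `W_cand` are assumed names, not from the paper) would be trained jointly with the task loss rather than hand-designed.

```python
import numpy as np

rng = np.random.default_rng(0)
d = 64  # pattern dimensionality (illustrative choice)

def hopfield_write(M, x):
    """Fixed, rule-based update: accumulate the outer product of the pattern."""
    return M + np.outer(x, x)

def learned_write(M, x, W_gate, W_cand):
    """Hypothetical learnable update: a sigmoid gate and a candidate
    association are computed from the input; W_gate / W_cand would be
    trained jointly with the task objective instead of being fixed."""
    g = 1.0 / (1.0 + np.exp(-(W_gate @ x)))   # per-row write gate in (0, 1)
    cand = np.outer(np.tanh(W_cand @ x), x)   # candidate association
    return (1.0 - g)[:, None] * M + g[:, None] * cand

# Store two random unit patterns with the fixed rule, then retrieve one.
x1 = rng.standard_normal(d); x1 /= np.linalg.norm(x1)
x2 = rng.standard_normal(d); x2 /= np.linalg.norm(x2)
M = hopfield_write(np.zeros((d, d)), x1)
M = hopfield_write(M, x2)
recall = M @ x1  # retrieval is a matrix-vector product
cos = np.dot(recall, x1) / np.linalg.norm(recall)  # near 1 for near-orthogonal patterns

# One step of the (randomly initialized) learnable update, to show shapes.
W_gate = 0.1 * rng.standard_normal((d, d))
W_cand = 0.1 * rng.standard_normal((d, d))
M2 = learned_write(np.zeros((d, d)), x1, W_gate, W_cand)
```

With random near-orthogonal patterns the fixed rule already retrieves well; the point of a learned rule, as the paper argues, is to push capacity beyond what such hand-designed updates allow.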
Similar resources
A Self-Reconstructing Algorithm for Single and Multiple-Sensor Fault Isolation Based on Auto-Associative Neural Networks
Recently, different approaches have been developed in the field of sensor fault diagnostics based on the Auto-Associative Neural Network (AANN). In this paper we present a novel algorithm called the Self-Reconstructing Auto-Associative Neural Network (S-AANN), which is able to detect and isolate a single faulty sensor via reconstruction. We have also extended the algorithm to be applicable to multiple faul...
Associative Long Short-Term Memory
We investigate a new method to augment recurrent neural networks with extra memory without increasing the number of network parameters. The system has an associative memory based on complex-valued vectors and is closely related to Holographic Reduced Representations and Long Short-Term Memory networks. Holographic Reduced Representations have limited capacity: as they store more information, ea...
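The snippet below is a small self-contained illustration (not the Associative LSTM itself) of the Holographic Reduced Representation operations this abstract refers to: binding by circular convolution, superposing several bound pairs in one fixed-size trace, and approximate unbinding — the retrieval noise that grows with the number of stored pairs is exactly the limited capacity mentioned above.

```python
import numpy as np

rng = np.random.default_rng(1)
d = 1024  # vector dimensionality (illustrative)

def bind(a, b):
    """HRR binding: circular convolution, computed via FFT."""
    return np.fft.irfft(np.fft.rfft(a) * np.fft.rfft(b), n=d)

def unbind(trace, key):
    """Approximate unbinding: correlate the trace with the key
    (conjugation in the frequency domain is the HRR approximate inverse)."""
    return np.fft.irfft(np.conj(np.fft.rfft(key)) * np.fft.rfft(trace), n=d)

# Elements i.i.d. N(0, 1/d), the standard HRR setup (expected unit norm).
k1, v1, k2, v2 = rng.normal(0.0, 1.0 / np.sqrt(d), (4, d))

trace = bind(k1, v1) + bind(k2, v2)  # two key-value pairs, one fixed-size trace
v1_hat = unbind(trace, k1)           # noisy reconstruction of v1

cos = np.dot(v1_hat, v1) / (np.linalg.norm(v1_hat) * np.linalg.norm(v1))
# cos is well above chance but below 1: each extra stored pair adds noise.
```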
A multi-modular associator network for simple temporal sequence learning and generation
Temporal sequence generation readily occurs in nature, for example when performing a series of motor movements or recalling a sequence of episodic memories. Proposed networks which perform temporal sequence generation are often in the form of a modification to an auto-associative memory, using hetero-associative or time-varying synaptic strengths and requiring some pre-chosen temporal functions. Intra...
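As a deliberately minimal illustration of the hetero-associative idea described above, the sketch below stores the transitions of a short sequence in a single weight matrix and then regenerates the sequence from its first element; the orthonormal one-hot patterns are an assumption chosen so that recall is exact.

```python
import numpy as np

d, T = 8, 5
seq = np.eye(d)[:T]  # T orthonormal (one-hot) patterns: recall will be exact

# Hetero-associative storage: W maps each pattern to its successor.
W = np.zeros((d, d))
for t in range(T - 1):
    W += np.outer(seq[t + 1], seq[t])

# Sequence generation: start from the first pattern and iterate.
x = seq[0]
recalled = [x]
for _ in range(T - 1):
    x = W @ x  # retrieve the successor of the current pattern
    recalled.append(x)
```

With correlated (non-orthogonal) patterns the same scheme degrades, which is why such models resort to time-varying synaptic strengths or pre-chosen temporal functions.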
Memorization of Melodies by Complex-valued Recurrent Network
Memorization of temporal sequences is an important and interesting problem. One example is the memorization of melodies: when we memorize a melody, we can recall it from a fragment. This function can be considered an associative memory for temporal sequences. From a technical point of view, the rest of a melody can be determined from a few leading notes....
Neural Associative Memory for Dual-Sequence Modeling
Many important NLP problems can be posed as dual-sequence or sequence-to-sequence modeling tasks. Recent advances in building end-to-end neural architectures have been highly successful in solving such tasks. In this work we propose a new architecture for dual-sequence modeling that is based on associative memory. We derive AM-RNNs, a recurrent associative memory (AM) which augments generic recu...
Journal: CoRR
Volume: abs/1709.06493
Pages: -
Publication date: 2017